Posted Thursday, May 29, 2008 7:28 PM

Mind Reading: Another Step Closer

Sharon Begley

If things keep going like this, it will be as easy to pick up someone’s thoughts as their cell-phone conversation. It’s been only a few months since I wrote about scientists who had trained a computer to distinguish thoughts:

“Scientists at Carnegie Mellon University showed people drawings of five tools (hammer, drill and the like) and five dwellings (castle, igloo …) and asked them to think about each object’s properties, uses and anything else that came to mind. Meanwhile, fMRI measured activity throughout each volunteer’s brain. As the scientists report this month in the journal PLoS One, the activity pattern evoked by each object was so distinctive that the computer could tell with 78 percent accuracy when someone was thinking about a hammer and not, say, pliers. . . . Remarkably, the activity patterns—from visual areas to movement areas to regions that encode abstract ideas like the feudal associations of a castle—were eerily similar from one person to another. 'This establishes, as never before, that there is a commonality in how different people’s brains represent the same object,' said CMU’s Tom Mitchell.”
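For readers who like to see the machinery: the CMU classifiers are more sophisticated than this, but here is a minimal sketch, in Python with made-up numbers, of how a decoder can tell "hammer" from "pliers." Everything in it (the fake scans, the nearest-template rule) is my own illustrative stand-in, not the team's actual code.

import numpy as np

# Toy illustration of decoding "which object was the person thinking of?"
# from fMRI activity. The data are random stand-ins; the decoder simply
# matches a new scan to the most similar per-object average pattern.
rng = np.random.default_rng(1)
n_voxels = 300
objects = ["hammer", "drill", "pliers", "castle", "igloo"]

# Each object gets a fake "signature" brain pattern; scans are signature + noise.
signatures = {obj: rng.normal(size=n_voxels) for obj in objects}
train = {obj: signatures[obj] + rng.normal(size=(6, n_voxels)) for obj in objects}

# "Training" here is just averaging each object's six scans into a template.
templates = {obj: scans.mean(axis=0) for obj, scans in train.items()}

def decode(scan):
    # Label a new scan with the object whose template correlates best.
    return max(templates, key=lambda obj: np.corrcoef(scan, templates[obj])[0, 1])

new_scan = signatures["castle"] + rng.normal(size=n_voxels)
print(decode(new_scan))  # prints "castle" on essentially every run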


Now the same team, led by computer scientist Mitchell and cognitive neuroscientist Marcel Just, has taken the next step. As they report in tomorrow’s issue of the journal Science, they can now identify the unique brain activation patterns associated with scores of concrete nouns—that is, names for things you can see, hear, feel, taste or smell.


The team started with the fMRI patterns for 60 concrete nouns, including words for animals, body parts, buildings, clothing, insects, vehicles and vegetables. They then had their computer statistically analyze texts totaling more than 1 trillion words. For each noun, the computer calculated how frequently that noun co-occurs in those texts with each of 25 verbs that have sensory or movement meanings, such as see, hear, listen, taste, smell, eat, push, drive and lift. The computer model combined the two sets of data to predict the activation patterns for thousands of concrete nouns, and racked up an accuracy of 77 percent.
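To make that concrete: the heart of the model is linear regression, one equation per spot in the brain. How strongly a given voxel lights up for a noun is predicted as a weighted sum of that noun's 25 verb co-occurrence scores. Here is a minimal sketch in Python with random stand-in data; the variable names and toy numbers are mine, not the paper's.

import numpy as np

# Sketch of the paper's linear model with toy data. Each noun is described
# by how often it co-occurs with 25 sensorimotor verbs in a huge corpus;
# a per-voxel linear regression maps those 25 scores to fMRI activation.
rng = np.random.default_rng(0)
n_nouns, n_verbs, n_voxels = 60, 25, 500  # 60 training nouns, as in the study

X = rng.poisson(5.0, size=(n_nouns, n_verbs)).astype(float)  # noun-by-verb co-occurrence
Y = rng.normal(size=(n_nouns, n_voxels))                     # noun-by-voxel observed fMRI

# Fit one least-squares weight vector per voxel (all 500 fits at once).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)                    # shape: (n_verbs, n_voxels)

# Predict the whole-brain pattern for a noun the scanner never saw,
# using nothing but its verb co-occurrence profile from the corpus.
x_unseen = rng.poisson(5.0, size=n_verbs).astype(float)
predicted_pattern = x_unseen @ W                             # shape: (n_voxels,)

Roughly speaking, the 77 percent figure comes from a test along these lines: hold two nouns out, predict both brain images from text statistics alone, then ask whether the model can tell which predicted image goes with which observed scan.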

“We believe we have identified a number of the basic building blocks that the brain uses to represent meaning,” said Mitchell in a statement. “Coupled with computational methods that capture the meaning of a word by how it is used in text files, these building blocks can be assembled to predict neural activation patterns for any concrete noun.”

A big reason this seems to work is that, as Just explained, people “are fundamentally perceivers and actors. So the brain represents the meaning of a concrete noun in areas associated with how people sense it or manipulate it. The meaning of an apple, for instance, is represented in brain areas responsible for tasting, for smelling, for chewing. An apple is what you do with it.”

Besides sensory and motor areas, brain regions that become active when people think of concrete nouns include frontal areas, which handle planning and also encode long-term memory. The thought of an apple, it seems, triggers a remembrance of apples past, and also thoughts about how to get one.

That’s why the model struggles to distinguish apple from pear, for instance, and it also suggests that words other than concrete nouns—by, under, over, around, soon, later, enormous and every other preposition, adjective and adverb, not to mention verbs—will be tough for the computer to read from patterns of brain activity. Yet these words are, needless to say, crucial to determining meaning. The computer can’t tell “It’s time to smuggle the bomb into the harbor” from “It’s criminal how poorly the harbor is defended against bombs.” But at the rate scientists are going, it might not be long before it can.


 